poe api

Burnchi · 5851 · 12 min · 2024-11-23

About the library

https://github.com/snowby666/poe-api-wrapper

Install the library

bash
pip install -U poe-api-wrapper

Quick start

Each run creates a new conversation!

  1. Create a new .py file (any name)
  2. Paste the following content
python
from poe_api_wrapper import AsyncPoeApi
import asyncio

# Replace these with your own tokens
tokens = {
    'p-b': 'xxx',
    'p-lat': 'xxx',
}

# Which bot to use
bot = "gpt3_5"

# The prompt to ask
message = """
Is Lu Xun the same person as Zhou Shuren?
"""

async def main():
    client = await AsyncPoeApi(tokens=tokens).create()
    print("=" * 50)
    print(f"Prompt: {message.strip()}")
    print("=" * 50)
    async for chunk in client.send_message(bot=bot, message=message):
        print(chunk["response"], end='', flush=True)

asyncio.run(main())

(This example uses GPT-3.5-Turbo-Raw; to use a different model, see the table below.)

Note: you must replace the tokens here; they are obtained from the Poe website!


Getting p-b and p-lat

Press F12 to open Devtools (or right-click and choose Inspect), then look for the cookies in the locations below. A sketch for loading them without hardcoding follows the list.

  • Chromium: Devtools > Application > Cookies > poe.com
  • Firefox: Devtools > Storage > Cookies
  • Safari: Devtools > Storage > Cookies
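If you prefer not to hardcode the cookie values in the script, they can be read from environment variables instead. A minimal sketch, assuming the (hypothetical) variable names POE_P_B and POE_P_LAT:

python
import os

# Hypothetical environment variable names; set them to the cookie values
# copied from Devtools before running the script.
tokens = {
    'p-b': os.environ['POE_P_B'],
    'p-lat': os.environ['POE_P_LAT'],
}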

Getting the formkey

F12 for Devtools (Right-click + Inspect)

  • 1st Method: Devtools > Network > gql_POST > Headers > Poe-Formkey, then copy the value of Poe-Formkey
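The quick-start script above only needs p-b and p-lat; the "Connecting to the API" example later in this post also passes the formkey. If you use it, the tokens dict simply gains one more entry:

python
# Same dict as in the quick start, extended with the Poe-Formkey value
# copied from Devtools (replace the placeholders with your own values).
tokens = {
    'p-b': 'xxx',
    'p-lat': 'xxx',
    'formkey': 'xxx',
}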
  3. Run the .py file
bash
python xxx.py

The answer will then be returned.


Other models

| Display Name | Model | Token Limit | Words |
| --- | --- | --- | --- |
| Assistant | capybara | 4K | 3K |
| Claude-3.5-Sonnet | claude_3_igloo | 4K | 3K |
| Claude-3-Opus | claude_2_1_cedar | 4K | 3K |
| Claude-3-Sonnet | claude_2_1_bamboo | 4K | 3K |
| Claude-3-Haiku | claude_3_haiku | 4K | 3K |
| Claude-3.5-Sonnet-200k | claude_3_igloo_200k | 200K | 150K |
| Claude-3-Opus-200k | claude_3_opus_200k | 200K | 150K |
| Claude-3-Sonnet-200k | claude_3_sonnet_200k | 200K | 150K |
| Claude-3-Haiku-200k | claude_3_haiku_200k | 200K | 150K |
| Claude-2 | claude_2_short | 4K | 3K |
| Claude-2-100k | a2_2 | 100K | 75K |
| Claude-instant | a2 | 9K | 7K |
| Claude-instant-100k | a2_100k | 100K | 75K |
| GPT-3.5-Turbo | chinchilla | 4K | 3K |
| GPT-3.5-Turbo-Raw | gpt3_5 | 2K | 1.5K |
| GPT-3.5-Turbo-Instruct | chinchilla_instruct | 2K | 1.5K |
| ChatGPT-16k | agouti | 16K | 12K |
| GPT-4-Classic | gpt4_classic | 2K | 1.5K |
| GPT-4-Turbo | beaver | 4K | 3K |
| GPT-4-Turbo-128k | vizcacha | 128K | 96K |
| GPT-4o | gpt4_o | 4K | 3K |
| GPT-4o-128k | gpt4_o_128k | 128K | 96K |
| GPT-4o-Mini | gpt4_o_mini | 4K | 3K |
| GPT-4o-Mini-128k | gpt4_o_mini_128k | 128K | 96K |
| Google-PaLM | acouchy | 8K | 6K |
| Code-Llama-13b | code_llama_13b_instruct | 4K | 3K |
| Code-Llama-34b | code_llama_34b_instruct | 4K | 3K |
| Solar-Mini | upstage_solar_0_70b_16bit | 2K | 1.5K |
| Gemini-1.5-Flash-Search | gemini_pro_search | 4K | 3K |
| Gemini-1.5-Pro-2M | gemini_1_5_pro_1m | 2M | 1.5M |
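To switch to one of these models, pass the value from the Model column as the bot name. A small sketch, assuming a synchronous client created as in the "Connecting to the API" section below:

python
# "claude_3_haiku" is the Model value for Claude-3-Haiku in the table above.
bot = "claude_3_haiku"

for chunk in client.send_message(bot, "Hello!"):
    print(chunk["response"], end="", flush=True)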

Using the OpenAI API

Install the extra package

bash
pip install -U 'poe-api-wrapper[llm]'

Possible error: ModuleNotFoundError: No module named 'openai'. Fix it by installing the openai package:

bash
D:\poe-api-wrapper\poe_api_wrapper\openai>pip install openai
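How to start the wrapper's OpenAI-compatible server is not covered in this post. Assuming such a server is already running locally, the official openai client could talk to it roughly like this; the base_url, api_key, and model name below are illustrative assumptions, not values taken from this post:

python
from openai import OpenAI

# Assumed local endpoint; adjust host/port to wherever your server listens.
oai_client = OpenAI(base_url="http://127.0.0.1:8000/v1", api_key="anything")

resp = oai_client.chat.completions.create(
    model="gpt3_5",  # assumed bot/model name; see the model table above
    messages=[{"role": "user", "content": "Is Lu Xun the same person as Zhou Shuren?"}],
)
print(resp.choices[0].message.content)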

Connecting to the API

Get a client object (all later operations need this object)

python
tokens = {
    'p-b': 'xxx',
    'p-lat': 'xxx',
    'formkey': 'xxx'
}

# Default setup
from poe_api_wrapper import PoeApi
client = PoeApi(tokens=tokens)

# Using Client with auto_proxy (default is False)
client = PoeApi(tokens=tokens, auto_proxy=False)

# Passing proxies manually
proxy_context = [
    {"https://127.0.0.1:8000", "http://127.0.0.1:8000"},
]
client = PoeApi(tokens=tokens, proxy=proxy_context)

If the connection succeeds, the console will show:

bash
2024-10-22 16:48:55.793 | INFO | poe_api_wrapper.api:select_proxy:109 - Connection established with {'https://127.0.0.1:8000', 'http://127.0.0.1:8000'}

Getting the chat ID & chat code

Get the chat history of all bots

python
print(client.get_chat_history()['data'])

Get the chat history of a specific bot

python
print(client.get_chat_history("web-search")['data'])

Get the most recent chats across all bots

python
print(client.get_chat_history(count=5)['data'])

Get the most recent chats of a specific bot

python
print(client.get_chat_history(bot="gpt3_5", count=2)['data'])

Get subscription info and points

python
data = client.get_settings()
print(data)

Sending a message and streaming the response

python
bot = "gpt3_5"
message = "Is Lu Xun the same person as Zhou Shuren?"

# Streamed example:
for chunk in client.send_message(bot, message):
    print(chunk["response"], end="", flush=True)
print("\n")

Get the id, code, and price of the current chat

python
# Streamed example:
for chunk in client.send_message(bot, message):
    print(chunk["response"], end="", flush=True)
print("\n")

# You can get chatCode and chatId of the created thread to continue the conversation
chatCode = chunk["chatCode"]
chatId = chunk["chatId"]

# You can also retrieve msgPrice
msgPrice = chunk["msgPrice"]

print(f'chatCode={chatCode}, chatId={chatId}, msgPrice={msgPrice}')

Send a message to an **existing chat thread (conversation)**

python
# 1. Using chatCode
for chunk in client.send_message(bot, message, chatCode="2i58ciex72dom7im83r"):
    print(chunk["response"], end="", flush=True)

# 2. Using chatId
for chunk in client.send_message(bot, message, chatId=59726162):
    print(chunk["response"], end="", flush=True)

Sending concurrent messages

python
import time, threading

thread_count = 0

def message_thread(prompt, counter):
    global thread_count
    try:
        chunk = None
        for chunk in client.send_message("gpt3_5", prompt):
            pass
        print(prompt + "\n" + chunk["text"] + "\n" * 3)
    except Exception as e:
        print(e)
    finally:
        # Always decrement, even on error, so the wait loop below can finish
        thread_count -= 1

prompts = [
    "Write a paragraph about the impact of social media on mental health.",
    "Write a paragraph about the history and significance of the Olympic Games.",
]

for i in range(len(prompts)):
    t = threading.Thread(target=message_thread, args=(prompts[i], i), daemon=True)
    t.start()
    thread_count += 1
    time.sleep(1)

while thread_count:
    time.sleep(0.01)
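Since the library also ships AsyncPoeApi (used in the quick start), concurrent prompts can be sent with asyncio tasks instead of threads. A minimal sketch, assuming the async client tolerates several send_message calls in flight at once:

python
import asyncio
from poe_api_wrapper import AsyncPoeApi

tokens = {'p-b': 'xxx', 'p-lat': 'xxx'}

async def ask(client, prompt):
    # Collect the streamed chunks into one string.
    parts = []
    async for chunk in client.send_message(bot="gpt3_5", message=prompt):
        parts.append(chunk["response"])
    return "".join(parts)

async def main():
    client = await AsyncPoeApi(tokens=tokens).create()
    prompts = [
        "Write a paragraph about the impact of social media on mental health.",
        "Write a paragraph about the history and significance of the Olympic Games.",
    ]
    answers = await asyncio.gather(*(ask(client, p) for p in prompts))
    for prompt, answer in zip(prompts, answers):
        print(prompt, answer, sep="\n", end="\n\n")

asyncio.run(main())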

Resend the previous prompt

python
for chunk in client.retry_message(chatCode='3q5ms8rehspbkbs2bt8'):
    print(chunk['response'], end='', flush=True)

Retrieving suggested replies

python
for chunk in client.send_message(bot, "Introduce 5 books about clean code", suggest_replies=True):
    print(chunk["response"], end="", flush=True)
print("\n")

for reply in chunk["suggestedReplies"]:
    print(reply)

Deleting chat threads

python
# Delete 1 chat
# Using chatCode
client.delete_chat(bot, chatCode="2i58ciex72dom7im83r")
# Using chatId
client.delete_chat(bot, chatId=59726162)

# Delete n chats
# Using chatCode
client.delete_chat(bot, chatCode=["LIST_OF_CHAT_CODES"])
# Using chatId
client.delete_chat(bot, chatId=["LIST_OF_CHAT_IDS"])

# Delete all chats of a bot
client.delete_chat(bot, del_all=True)

Clearing the conversation context

python
# 1. Using chatCode
client.chat_break(bot, chatCode="2i58ciex72dom7im83r")

# 2. Using chatId
client.chat_break(bot, chatId=59726162)

Purge all of the user's conversations (use with caution! All previous questions will disappear!)

python
client.purge_all_conversations()

Getting previous messages

python
# Get the most recent N messages of a bot
previous_messages = client.get_previous_messages(bot, chatCode='3q4q0jp7o8jnl7dyvro', count=5)
for message in previous_messages:
    print(message)

# Get all previous messages of a bot
previous_messages = client.get_previous_messages(bot, chatCode='3q4q0jp7o8jnl7dyvro', get_all=True)
for message in previous_messages:
    print(message)